# 7B (High-precision Q&A)
A 7B-parameter causal language model compatible with the Meta LLaMA 2 architecture, outperforming comparable models under 33B parameters in multiple evaluations.
- Task: Large Language Model
- Tags: Transformers, Multilingual
- Author: CausalLM
- Downloads: 177 · Likes: 135
# Questionanswering V7
A Q&A system model trained on the SQuAD dataset, capable of answering questions based on given text.
- License: Apache-2.0
- Task: Question Answering System
- Tags: Transformers, English
- Author: abdalrahmanshahrour
- Downloads: 14 · Likes: 1
# Question Answering Roberta Base S V2
A RoBERTa-based question answering model that infers the answer text, its span, and a confidence score given a question and context.
- License: Apache-2.0
- Task: Question Answering System
- Tags: Transformers
- Author: consciousAI
- Downloads: 1,832 · Likes: 10
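Extractive QA models like this one score every candidate start and end token in the context; the answer span and its confidence fall out of those scores. A minimal sketch of that span-selection step, using toy logits rather than this model's actual output:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def best_span(start_logits, end_logits, max_len=15):
    """Pick the (start, end) token pair with the highest joint
    probability, requiring end >= start and a bounded span length."""
    p_start = softmax(start_logits)
    p_end = softmax(end_logits)
    best = (0, 0, 0.0)
    for i, ps in enumerate(p_start):
        for j in range(i, min(i + max_len, len(p_end))):
            score = ps * p_end[j]
            if score > best[2]:
                best = (i, j, score)
    return best

# Toy logits over six context tokens (hypothetical values).
start = [0.1, 2.5, 0.3, 0.2, 0.1, 0.0]
end = [0.0, 0.2, 0.1, 3.0, 0.1, 0.2]
i, j, conf = best_span(start, end)
print(i, j, round(conf, 3))  # selects the span from token 1 to token 3
```

The joint probability of the chosen span serves as the confidence score the model card mentions.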
# Deberta V3 Large Squad2
A Q&A model based on the DeBERTa-v3-large architecture, fine-tuned on the SQuAD 2.0 dataset, capable of handling both answerable and unanswerable questions.
- License: MIT
- Task: Question Answering System
- Tags: Transformers, English
- Author: navteca
- Downloads: 35 · Likes: 0
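SquAD 2.0-style models decide answerability by comparing the best span score against a "no answer" (null) score, typically taken at the [CLS] token. A minimal sketch of that decision rule, with toy scores and a hypothetical tuning threshold:

```python
def is_answerable(best_span_score, null_score, threshold=0.0):
    """SQuAD 2.0-style decision: return an answer only if the best
    span beats the null ("no answer") score by at least `threshold`."""
    return best_span_score - null_score > threshold

# Toy logit scores (hypothetical, not this model's actual output).
print(is_answerable(7.2, 3.1))       # strong span beats null -> True
print(is_answerable(2.0, 5.5))       # null wins -> False
print(is_answerable(4.0, 3.9, 0.5))  # margin below threshold -> False
```

The threshold is normally tuned on a development set to trade off false answers against missed ones.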
# Roberta Base Cuad Finetuned
A RoBERTa model optimized for the Contract Understanding Atticus Dataset (CUAD) Q&A task, excelling at legal contract review.
- Task: Question Answering System
- Tags: Transformers, English
- Author: gustavhartz
- Downloads: 387 · Likes: 1
# Albert Xxlarge V2 Squad2 Covid Qa Deepset
A Q&A model based on the ALBERT-xxlarge-v2 architecture, fine-tuned on the SQuAD 2.0 and COVID-QA datasets.
- Task: Question Answering System
- Tags: Transformers
- Author: armageddon
- Downloads: 19 · Likes: 0